5 Future Work · 4.2 Learning with an Arbitrary Output Unit: Max k-F Consistency

Authors

  • Viggo Kann
  • Sanjeev Khanna
  • Jens Lagergren
  • Wee Sun Lee
  • Peter L. Bartlett
  • Michael I. Jordan
  • Michael J. Kearns
Abstract

…agnostic learning of neural networks with bounded fan-in. It would be interesting to extend the hardness result for networks with real outputs to the case of a linear output unit with a constraint on the size of the output weights. We conjecture that a similar result can be obtained, with a relative error bound that, unlike Vu's result for this case [19], does not decrease as the input dimension increases. It would also be worthwhile to extend the results to show that it is difficult to find a hypothesis that has expected loss nearly minimal over some neural network class, whatever hypothesis class is used. There is some related work in this direction. Theorem 7 in [3] shows that finding a conjunction of k' linear threshold functions that correctly classifies a set that can be correctly classified by a conjunction of k linear threshold functions is as hard as colouring a k-colourable graph with n vertices using k' colours, which has since been shown to be hard for k' = O(kn^ε) for some ε > 0 [15]. The cryptographic results mentioned in Section 1 do not have such strong restrictions on the hypothesis class, but they only apply to classes that are apparently considerably richer than the neural network classes studied in this paper.

4.2 Learning with an Arbitrary Output Unit: Max k-F Consistency

It is easy to show that the result is also true if the components of the vectors in S_in and S_out must be ratios of integers, provided the integers are allowed to be as large as ck^2, for some constant c. Hence, for each k, the number of bits needed to represent S is linear in the size of the graph G. These lemmas show that we have an L-reduction from Max k-Cut for k-colourable graphs to Max k-F Consistency, with parameters α = k/(k − 1) and β = 9(4k + 1), and this L-reduction preserves maximality. Combining this with Theorem 5 gives Theorem 2.

4.3 Learning with a sigmoid output unit: Max k-Consistency

We give an L-reduction from Max k-F Consistency to Max k-Consistency, where F is the class of linear threshold functions.
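The approximation-preserving arithmetic behind these parameters can be made concrete. The following is a minimal numeric sketch, assuming the standard definition of an L-reduction (f, g) with parameters α and β for maximisation problems, namely opt_B(f(x)) ≤ α·opt_A(x) and |opt_A(x) − c_A(g(y))| ≤ β·|opt_B(f(x)) − c_B(y)|; the function name and the illustrative error value are assumptions for illustration, not from the paper.

```python
# Hypothetical sketch (not the paper's code): how L-reduction parameters
# alpha and beta transfer relative error from the target problem back to
# the source problem.  Under the standard definition, if a solution y for
# f(x) is within relative error eps_b of opt_B(f(x)), the recovered
# solution g(y) is within relative error alpha * beta * eps_b of opt_A(x).

def transferred_relative_error(alpha: float, beta: float, eps_b: float) -> float:
    """Upper bound on the relative error of g(y) on the source instance,
    given relative error eps_b of y on the target instance."""
    return alpha * beta * eps_b

# Parameters of the reduction in the text, for a few values of k:
# alpha = k/(k - 1) and beta = 9(4k + 1).
for k in (2, 3, 10):
    alpha = k / (k - 1)
    beta = 9 * (4 * k + 1)
    # e.g. a solver within 1% of optimal on Max k-F Consistency yields a
    # solution within alpha * beta * 0.01 of optimal on Max k-Cut.
    print(k, alpha, beta, transferred_relative_error(alpha, beta, 0.01))
```

The growth of α·β with k is why such a reduction rules out a polynomial-time approximation scheme but not constant-factor approximation for each fixed k.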
Given a sample S for a Max k-F Consistency problem, we use the same sample for the Max k-Consistency problem. Trivially, if opt_{Max k-F Consistency}(S) = 1 then opt_{Max k-Consistency}(S) = 1. Furthermore, we have the following lemma.

Lemma 7. For a solution f …
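To fix intuition for the quantity being maximised, here is a hypothetical sketch of the consistency objective; the function names, the reading of Max k-F Consistency as "fraction of a labelled sample on which a two-layer network with k hidden units from F agrees with the labels", and the toy output unit are all assumptions for illustration, not definitions taken from the paper.

```python
# Hypothetical illustration: the objective of a Max k-F Consistency
# instance, read as the fraction of a labelled sample (S_in, S_out) fit
# by a two-layer network with k linear threshold hidden units feeding an
# arbitrary binary output unit g.  opt(S) = 1 means some such network
# classifies the whole sample correctly.
import numpy as np

def consistency(S_in, S_out, weights, biases, g):
    """S_in: (m, d) inputs; S_out: (m,) labels in {-1, +1};
    weights: (k, d) and biases: (k,) define k linear threshold units;
    g maps a vector of k hidden outputs in {-1, +1} to a label."""
    hidden = np.sign(S_in @ weights.T + biases)   # (m, k) threshold outputs
    preds = np.array([g(h) for h in hidden])
    return float(np.mean(preds == S_out))

# Toy check: two threshold units with a conjunction output unit realise
# AND(x1 > 0, x2 > 0) on four points in the plane.
S_in = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 1.0], [-1.0, -1.0]])
S_out = np.array([1.0, -1.0, -1.0, -1.0])
W = np.array([[1.0, 0.0], [0.0, 1.0]])            # units: sign(x1), sign(x2)
b = np.zeros(2)
g = lambda h: 1.0 if np.all(h > 0) else -1.0      # conjunction output unit
print(consistency(S_in, S_out, W, b, g))
```

Under this reading, the hardness results say that even maximising this fraction approximately is intractable, not merely deciding whether it equals 1.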



Publication date: 1999